Digital Library


Search: "[ keyword: Distributed Deep Learning ]" (2)
    Dynamic Resource Adjustment Operator Based on Autoscaling for Improving Distributed Training Job Performance on Kubernetes
    Jinwon Jeong, Heonchang Yu. KIPS Transactions on Computer and Communication Systems, Vol. 11, No. 7, pp. 205-216, Jul. 2022
    https://doi.org/10.3745/KTCCS.2022.11.7.205
    Keywords: Kubeflow, Kubernetes, Distributed Deep Learning Training, Resource Adjustment Operator

    Hybrid All-Reduce Strategy with Layer Overlapping for Reducing Communication Overhead in Distributed Deep Learning
    Daehyun Kim, Sangho Yeo, Sangyoon Oh. KIPS Transactions on Computer and Communication Systems, Vol. 10, No. 7, pp. 191-198, Jul. 2021
    https://doi.org/10.3745/KTCCS.2021.10.7.191
    Keywords: Distributed Deep Learning, Synchronization, Layer Overlapping, Allreduce